27 - Pattern Recognition (PR) [ID:2729]

The following content has been provided by the University of Erlangen-Nürnberg.

So Professor Hornegger will be here today in about 15 minutes, I guess. 15 minutes plus X. I have no idea how large X actually is. Right now he is still in a meeting with the Chancellor of our university, so he is not here yet. He told me this morning that he has not finished the independent component analysis section, so we will start here again.

This is a pretty nice example of what independent component analysis is all about. You have two signals that are statistically independent: a uniform distribution for S1 and a uniform distribution for S2, and you see this joint distribution on the square here. So these are your two source signals, and then you merge them: you mix them with the mixing matrix A, and that's the result. You now see a uniform distribution, not on a square but on this parallelogram. Okay, that's the effect of the mixing matrix.
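As a quick numerical sketch of this mixing step (the concrete values of the mixing matrix A are my own assumption; the lecture only shows the plots):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two statistically independent source signals, each uniform on [-1, 1].
n = 5000
S = rng.uniform(-1.0, 1.0, size=(2, n))  # rows: s1, s2

# An assumed, illustrative mixing matrix A; any invertible 2x2 matrix
# produces the same qualitative picture.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

# Mixed observations: each column x = A @ s lies in the parallelogram
# spanned by the columns of A, scaled by the [-1, 1] source range.
X = A @ S

# Because A is invertible, mapping back recovers the original square support.
S_back = np.linalg.inv(A) @ X
print(np.abs(S_back).max())
```

Plotting the columns of `S` and of `X` as 2-D scatter plots reproduces the square and the parallelogram from the slide.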

And now we compute the marginals. You have the joint probability density function p of the two sources S1 and S2, and you compute the integral over S2, for example; then you get this histogram, which is more or less a flat line, and it would be exactly flat if you had more samples here. If you now do the same for the mixed signal, you take the joint probability density of X1 and X2 and compute the marginal over X2, and you get this sort of histogram, which looks much more like a Gaussian distribution than the first one. Okay, so here we have roughly a Gaussian distribution, and that tells us that we have not an independent component but a mixture. So the idea is the following: here we are just projecting onto X2, and we want to find a different projection where the result does not look like a Gaussian. That's the algorithm.
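This effect on the marginals can be checked numerically. A small sketch, assuming illustrative mixing weights, that compares the excess kurtosis of a source marginal with that of a mixed marginal (the mixture should be closer to 0, i.e. closer to Gaussian):

```python
import numpy as np

rng = np.random.default_rng(1)

def excess_kurtosis(y):
    """Sample excess kurtosis: 0 for a Gaussian, about -1.2 for a uniform."""
    y = y - y.mean()
    return (y**4).mean() / (y**2).mean()**2 - 3.0

n = 200_000
s1 = rng.uniform(-1.0, 1.0, n)
s2 = rng.uniform(-1.0, 1.0, n)

# Marginal of a single source: uniform, strongly non-Gaussian.
k_source = excess_kurtosis(s1)

# Marginal of a mixture x1 = a*s1 + b*s2 (weights chosen for illustration).
x1 = 2.0 * s1 + 1.0 * s2
k_mixed = excess_kurtosis(x1)

# |k_mixed| < |k_source|: the mixture looks more Gaussian.
print(k_source, k_mixed)
```

This is a small instance of the central-limit effect: sums of independent variables tend toward a Gaussian, which is why the mixed marginal looks more Gaussian than a single source.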

We have our data. First we apply a centering transformation; that's pretty easy. Then we apply a whitening transformation, so we get decorrelated samples, and then we start to compute the independent components. For the first independent component, we select a random vector W1, a unit vector of length 1, and we project our samples X onto this vector; we choose W1 such that the non-Gaussianity of the projection is maximized. If we have the first independent component, then we select the second one with the same procedure, but with an additional constraint: the independent components have to be perpendicular. And then we compute all our independent components, put them into a matrix, and then we can compute our source signals.
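The procedure just described can be sketched roughly as follows. This is a FastICA-style deflation scheme with the cubic (kurtosis-related) nonlinearity; the mixing matrix and all numerical details are assumptions for illustration, not the lecture's literal algorithm:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data: two independent uniform sources, mixed with an assumed matrix.
n = 20_000
S = rng.uniform(-1.0, 1.0, size=(2, n))
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
X = A @ S

# Step 1: centering.
X = X - X.mean(axis=1, keepdims=True)

# Step 2: whitening (decorrelate and normalize the variance).
cov = X @ X.T / n
eigval, eigvec = np.linalg.eigh(cov)
X_white = eigvec @ np.diag(eigval**-0.5) @ eigvec.T @ X

# Step 3: deflation, one unit vector w per component.
# Fixed-point update for kurtosis-style non-Gaussianity with g(u) = u^3.
W = []
for _ in range(2):
    w = rng.normal(size=2)
    w /= np.linalg.norm(w)
    for _ in range(200):
        y = w @ X_white
        w_new = (X_white * y**3).mean(axis=1) - 3.0 * w
        for v in W:                       # keep components perpendicular
            w_new -= (w_new @ v) * v
        w_new /= np.linalg.norm(w_new)
        converged = abs(abs(w_new @ w) - 1.0) < 1e-10
        w = w_new
        if converged:
            break
    W.append(w)

W = np.stack(W)       # estimated unmixing matrix for the whitened data
Y = W @ X_white       # recovered source signals (up to order, sign, scale)
```

ICA can only recover the sources up to permutation, sign, and scale, so a sanity check compares each row of `Y` against the true sources by absolute correlation.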

So you compare your probability density function with a Gaussian, and you have basically two types: sub-Gaussians and super-Gaussians. The super-Gaussians are more spiky; over there, that's a Laplacian distribution. The sub-Gaussians look, for example, like this example here, where we have a uniform distribution. And the kurtosis, which measures the deviation from Gaussianity, is negative for sub-Gaussians, positive for super-Gaussians, and 0 for a Gaussian. This, I think, was the last slide where you stopped last week.
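A minimal check of these three cases, using the standard sample formula for excess kurtosis:

```python
import numpy as np

rng = np.random.default_rng(3)

def excess_kurtosis(y):
    """Sample excess kurtosis, normalized so a Gaussian gives 0."""
    y = y - y.mean()
    return (y**4).mean() / (y**2).mean()**2 - 3.0

n = 200_000
k_uniform = excess_kurtosis(rng.uniform(-1, 1, n))  # sub-Gaussian:   ~ -1.2
k_laplace = excess_kurtosis(rng.laplace(0, 1, n))   # super-Gaussian: ~ +3
k_normal = excess_kurtosis(rng.normal(0, 1, n))     # Gaussian:       ~  0
print(k_uniform, k_laplace, k_normal)
```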

We have this projection of our data X onto the direction W; the result is Y. We know X can be computed from the source signals with the mixing matrix A, and as we are interested in the source signals, we can summarize W transposed times the matrix A as Z transposed. In the two-dimensional case, Y is then the sum of two products: Z1 times S1 plus Z2 times S2. Now we compute the kurtosis of Y, and that is basically this term: Z1 to the power of 4 times the kurtosis of S1, plus Z2 to the power of 4 times the kurtosis of S2. And we have one constraint: the expectation value of Y squared has to be 1. Now we are looking for a solution of this. So we are again in the two-dimensional case; we have the unit circle here in the
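The kurtosis identity just stated can be verified numerically. A sketch assuming unit-variance uniform sources and an arbitrarily chosen point z = (0.6, 0.8) on the unit circle, so that the constraint E[y²] = 1 holds:

```python
import numpy as np

rng = np.random.default_rng(4)

def excess_kurtosis(y):
    y = y - y.mean()
    return (y**4).mean() / (y**2).mean()**2 - 3.0

# Unit-variance uniform sources: uniform on [-sqrt(3), sqrt(3)].
n = 500_000
s1 = rng.uniform(-np.sqrt(3), np.sqrt(3), n)
s2 = rng.uniform(-np.sqrt(3), np.sqrt(3), n)

# A point z on the unit circle, so var(y) = z1^2 + z2^2 = 1 = E[y^2].
z1, z2 = 0.6, 0.8
y = z1 * s1 + z2 * s2

# kurt(y) = z1^4 * kurt(s1) + z2^4 * kurt(s2); for a uniform, kurt = -1.2.
predicted = z1**4 * (-1.2) + z2**4 * (-1.2)
measured = excess_kurtosis(y)
print(predicted, measured)
```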

Part of a video series
Available via: open access
Duration: 00:29:10 min
Recording date: 2013-01-28
Uploaded: 2013-01-29 15:55:34
Language: en-US
